FeFET-Based Binarized Neural Networks Under Temperature-Dependent Bit Errors

Authors

Abstract

Ferroelectric FET (FeFET) is a highly promising emerging non-volatile memory (NVM) technology, especially for binarized neural network (BNN) inference on the low-power edge. The reliability of such devices, however, inherently depends on temperature. Hence, changes in temperature during run time manifest themselves as bit error rates. In this work, we reveal a temperature-dependent bit error model for FeFET memories, evaluate its effect on BNN accuracy, and propose countermeasures. We begin at the transistor level to accurately model the impact of temperature on the bit error rates of FeFET. This analysis reveals asymmetric bit error rates. Afterwards, at the application level, we evaluate the impact of these bit errors on the accuracy of BNNs. Under such errors, the accuracy drops to unacceptable levels when no countermeasures are employed. We propose two countermeasures: (1) training BNNs for bit error tolerance by injecting bit flips into the data, and (2) applying a bit error rate assignment algorithm (BERA), which operates in a layer-wise manner and does not inject bit flips during training. In the experiments, the BNNs to which the countermeasures are applied effectively tolerate bit errors over the entire range of operating temperatures.
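As a rough illustration of countermeasure (1), the following Python sketch injects asymmetric bit flips into binarized weights or activations before a forward pass. This is a minimal sketch, not the authors' code; the function name inject_bit_flips and the rates p_low_to_high / p_high_to_low are assumptions standing in for the temperature-dependent FeFET error rates.

import numpy as np

def inject_bit_flips(binary_values, p_low_to_high=0.01, p_high_to_low=0.05, rng=None):
    # Flip entries of a {-1, +1} tensor with asymmetric probabilities,
    # mimicking read errors whose rates differ per stored state.
    rng = np.random.default_rng() if rng is None else rng
    flips_up = (binary_values == -1) & (rng.random(binary_values.shape) < p_low_to_high)
    flips_down = (binary_values == +1) & (rng.random(binary_values.shape) < p_high_to_low)
    noisy = binary_values.copy()
    noisy[flips_up] = +1
    noisy[flips_down] = -1
    return noisy

# Example: corrupt a binarized weight matrix before the forward pass so
# that training sees the same kind of errors as the FeFET memory would produce.
weights = np.sign(np.random.randn(4, 4))
weights[weights == 0] = 1
noisy_weights = inject_bit_flips(weights, p_low_to_high=0.02, p_high_to_low=0.08)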


Similar Articles

Binarized Neural Networks

In this work we introduce a binarized deep neural network (BDNN) model. BDNNs are trained using a novel binarized back-propagation algorithm (BBP), which uses binary weights and binary neurons during the forward and backward propagation, while retaining the precision of the stored weights in which gradients are accumulated. At test time, BDNNs are fully binarized and can be implemented in hardware...
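For context, here is a minimal Python sketch of the general training pattern described above: binary weights and activations in the forward pass, with gradients accumulated in full-precision latent weights via a straight-through-style update. The names binarize, forward, and sgd_step are illustrative assumptions, not the paper's BBP implementation.

import numpy as np

def binarize(x):
    # Deterministic binarization to {-1, +1}.
    return np.where(x >= 0, 1.0, -1.0)

def forward(latent_weights, inputs):
    # Binary weights and binary activations are used for computation.
    binary_weights = binarize(latent_weights)
    return binarize(inputs @ binary_weights)

def sgd_step(latent_weights, grad_wrt_binary_weights, lr=0.01):
    # Straight-through-style update: the gradient computed for the binary
    # weights is applied to the full-precision latent weights, which are
    # then clipped to keep them in a bounded range.
    latent_weights = latent_weights - lr * grad_wrt_binary_weights
    return np.clip(latent_weights, -1.0, 1.0)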


Embedded Binarized Neural Networks

We study embedded Binarized Neural Networks (eBNNs) with the aim of allowing current binarized neural networks (BNNs) in the literature to perform feedforward inference efficiently on small embedded devices. We focus on minimizing the required memory footprint, given that these devices often have memory as small as tens of kilobytes (KB). Beyond minimizing the memory required to store weights, ...


Attacking Binarized Neural Networks

Neural networks with low-precision weights and activations offer compelling efficiency advantages over their full-precision equivalents. The two most frequently discussed benefits of quantization are reduced memory consumption, and a faster forward pass when implemented with efficient bitwise operations. We propose a third benefit of very low-precision neural networks: improved robustness again...


Verification of Binarized Neural Networks

We study the problem of formal verification of Binarized Neural Networks (BNN), which have recently been proposed as an energy-efficient alternative to traditional learning networks. The verification of BNNs, using the reduction to hardware verification, can be even more scalable by factoring computations among neurons within the same layer. By proving the NP-hardness of finding optimal factoring...


Verifying Properties of Binarized Deep Neural Networks

Understanding properties of deep neural networks is an important challenge in deep learning. In this paper, we take a step in this direction by proposing a rigorous way of verifying properties of a popular class of neural networks, Binarized Neural Networks, using the well-developed means of Boolean satisfiability. Our main contribution is a construction that creates a representation of a binar...



Journal

Journal Title: IEEE Transactions on Computers

Year: 2022

ISSN: 1557-9956, 2326-3814, 0018-9340

DOI: https://doi.org/10.1109/tc.2021.3104736